Less Is More: Towards Compact CNNs

Authors

  • Hao Zhou
  • Jose M. Alvarez
  • Fatih Murat Porikli
Abstract

We take AlexNet [1] as an example to show how we compute the parameter compression and memory footprint reduction of a network. Table 1 shows the structure of the filters in each convolutional and fully connected layer of AlexNet [1]. Each neuron is an order-3 tensor; the width, height, and input columns give the number of elements along each of its three modes, and the output column gives the number of neurons in each layer (which is also the number of output features of that layer). Here we suppose AlexNet is used to classify images into 1000 categories, so fc8 contains 1000 output channels. In total, the network has 60,965,224 parameters. Our goal is to reduce the number of neurons, i.e. the numbers in the "output" column of Table 1. Note that the number of input channels of the (i+1)-th layer depends on the number of output channels of the i-th layer: for example, if we remove n neurons from fc6, the corresponding n input channels of fc7 are removed as well.
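Table 1 itself is not reproduced in this excerpt, so the sketch below fills in the standard AlexNet layer dimensions (with grouped convolutions in conv2, conv4, and conv5, whose input depth is half the preceding layer's output); under that assumption, counting the weights plus one bias per neuron reproduces the quoted total of 60,965,224 parameters. The `layers` table and helper names are illustrative, not from the paper.

```python
# Hypothetical reconstruction of Table 1 as (width, height, input, output)
# per layer. Standard AlexNet dimensions are assumed; conv2/conv4/conv5 use
# grouped convolution, so their input depth is half the previous output.
layers = {
    "conv1": (11, 11, 3, 96),
    "conv2": (5, 5, 48, 256),
    "conv3": (3, 3, 256, 384),
    "conv4": (3, 3, 192, 384),
    "conv5": (3, 3, 192, 256),
    "fc6":   (6, 6, 256, 4096),
    "fc7":   (1, 1, 4096, 4096),
    "fc8":   (1, 1, 4096, 1000),
}

def layer_params(w, h, c_in, c_out):
    # Each of the c_out neurons is an order-3 tensor with w*h*c_in weights,
    # plus one bias per neuron.
    return (w * h * c_in + 1) * c_out

total = sum(layer_params(*dims) for dims in layers.values())
print(f"{total:,}")  # 60,965,224

def pruning_savings(layer, next_layer, n):
    # Removing n neurons from a layer removes their own weights and biases,
    # and also the n corresponding input channels of every neuron in the
    # next layer (the coupling described in the text).
    w, h, c_in, _ = layer
    next_out = next_layer[3]
    return (w * h * c_in + 1) * n + n * next_out

# Each fc6 neuron removed saves 9,217 of its own parameters plus 4,096
# downstream weights in fc7:
print(f"{pruning_savings(layers['fc6'], layers['fc7'], 1):,}")  # 13,313
```

Under these assumed dimensions, fc6 alone accounts for roughly 62% of all parameters, so pruning neurons in the fully connected layers has an outsized effect on the total count.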


Similar Articles

Learning Document Image Features With SqueezeNet Convolutional Neural Network

The classification of various document images is considered an important step towards building a modern digital library or office automation system. Convolutional Neural Network (CNN) classifiers trained with backpropagation are considered the current state-of-the-art model for this task. However, these classifiers have two major drawbacks: the huge computational power demand for...


Compact Deep Convolutional Neural Networks for Image Classification

Convolutional Neural Networks are efficient at learning hierarchical features from large datasets, but their model complexity and large memory footprint prevent them from being deployed to devices without server backend support. Modern CNNs are always trained on GPUs or even GPU clusters with high-speed computation power due to the immense size of the network. Methods on regulating the siz...


Espresso: Efficient Forward Propagation for Binary Deep Neural Networks

There are many application scenarios in which the computational performance and memory footprint of the prediction phase of Deep Neural Networks (DNNs) need to be optimized. Binary Deep Neural Networks (BDNNs) have been shown to be an effective way of achieving this objective. In this paper, we show how Convolutional Neural Networks (CNNs) can be implemented using binary representations. Espr...


Focal-Plane and Multiple Chip VLSI Approaches to CNNs

In this paper, three alternative VLSI analog implementations of CNNs are described, which have been devised to perform image processing and vision tasks: a programmable low-power CNN with embedded photosensors, a compact fixed-template CNN based on unipolar current-mode signals, and basic CMOS circuits to implement an extended CNN model using spikes. The first two VLSI approaches are intended f...


Towards Evolutional Compression

Compressing convolutional neural networks (CNNs) is essential for transferring the success of CNNs to a wide variety of applications on mobile devices. In contrast to directly recognizing subtle weights or filters as redundant in a given CNN, this paper presents an evolutionary method to automatically eliminate redundant convolution filters. We represent each compressed network as a binary indi...


3D-FilterMap: A Compact Architecture for Deep Convolutional Neural Networks

In this paper, we present a novel and compact architecture for deep Convolutional Neural Networks (CNNs), termed 3D-FilterMap Convolutional Neural Networks (3D-FM-CNNs). The convolution layer of a 3D-FM-CNN learns a compact representation of the filters, named a 3D-FilterMap, instead of a set of independent filters as in the conventional convolution layer. The filters are extracted from the 3D-FilterMa...




Publication year: 2016